chore: modernise library #156
Conversation
- use `pyproject.toml` to define project metadata and setuptools-specific configuration, removing the need for a `setup.py` file
- move sources into the conventional `src/` directory
- rework stubs so that they're visible downstream; PEP 561 doesn't support top-level `.pyi` files, so I made a dummy `alsaaudio-stubs` package instead
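For readers less familiar with a pyproject-only setup, a rough sketch of what such a configuration can look like is below. This is an illustration only, not the exact file from this PR: the version, Python requirement, and the experimental `ext-modules` table are assumptions.

```toml
# Illustrative pyproject.toml for a src/-layout C-extension project.
# Names, versions, and options are placeholders, not the PR's actual file.
[build-system]
requires = ["setuptools>=74.1"]   # declaring ext-modules in TOML needs a recent setuptools
build-backend = "setuptools.build_meta"

[project]
name = "pyalsaaudio"
version = "0.0.0"                 # placeholder
description = "ALSA bindings for Python"
requires-python = ">=3.8"

[tool.setuptools.packages.find]
where = ["src"]                   # discover packages (e.g. the stub package) under src/

# The C extension itself, replacing the old setup.py Extension(...) call.
[[tool.setuptools.ext-modules]]
name = "alsaaudio"
sources = ["src/alsaaudio.c"]
libraries = ["asound"]
```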
Thank you, I appreciate it. I'm hoping to pull your changes, build locally, and educate myself about `uv.lock` this week.
Thanks! As for `uv.lock`: … For consumers of the library, …
Is my understanding correct that the mypy, pyright and Sphinx dependencies are purely for testing and generating documentation, and do not impact deployment in any way?
They're "dev" tools for linting/testing and doc generation, so they're not exposed to end users (the goal is still for the library to have zero runtime dependencies). However, as for "impact deployment": if we automate the process with CI, we could ideally use lint/test results to block deployment (or PR merges) when CI checks are red. This could also be done in a "soft" fashion (i.e. PR checks are only indicative, the PR can still be merged even if red, and deployment doesn't run any additional linting). I haven't written any GitHub workflow to implement automatic deployment yet (which up to now, I believe, was done purely manually?)
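For concreteness, such dev-only tools would typically live in a dependency group (or an extra) rather than in the runtime requirements. The snippet below is a hypothetical example using PEP 735 dependency groups, not necessarily what this PR does:

```toml
# Hypothetical dev-only dependency group (PEP 735); these are never installed
# by `pip install pyalsaaudio`, only by developer tooling such as `uv sync`.
[dependency-groups]
dev = [
    "mypy",
    "pyright",
    "sphinx",
]
```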
My concerns were purely the install time and run time. I am not sure whether setting up CI is worth the effort, for two reasons: I have no insight into the effort required, and the amount of code change on this repository seems moderate to me. I would also expect that CI brings its own maintenance costs. Modernizing the build method is something that will become unavoidable soon, so I am happy you are taking it on.
CI should deliberately be simple: the advantage is to let contributors put their (limited) time into actual development rather than boring devops. Moreover, providing wheels gives users the additional benefit of true "zero runtime dependencies", since not only would they not need the ALSA headers or a compiler toolchain installed, they could also forgo installing …

Building wheels for manylinux by hand is definitely doable but not worth the effort, since manylinux comes with some caveats about GLIBC version compatibility (and that's probably why I only see sdists on PyPI for …).

I'll try to get some time this week to set up an as-simple-as-it-gets GitHub workflow, then we can review whether it's worth adopting.
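(The tool name is cut off above, so purely as an illustration: cibuildwheel is one common way to get manylinux wheels without handling the GLIBC/auditwheel details by hand. A hypothetical configuration might look like the following; the build selector and package name are assumptions, not taken from this PR.)

```toml
# Hypothetical cibuildwheel settings in pyproject.toml; not from this PR.
[tool.cibuildwheel]
build = "cp3*-manylinux_*"        # CPython manylinux wheels only

[tool.cibuildwheel.linux]
# manylinux images are RHEL-based, so the ALSA headers come from alsa-lib-devel;
# auditwheel then bundles the shared library into the wheel.
before-all = "yum install -y alsa-lib-devel"
```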
I forgot to mention: by CI I mostly mean documentation generation and PyPI publishing of sdists and wheels, i.e. the tasks that are boring but have a direct impact on final users (how up to date the versions on PyPI and the online documentation are). As far as "effort" is concerned, we could leave testing/linting out of the picture and just leave some indications on how to run them, purely for the convenience of developers. Not doing any CI around that would definitely reduce maintenance costs. For reference, I used …
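As a rough illustration of the publishing half (this is a generic sketch, not any actual workflow from this PR; the release trigger, Python version, and trusted-publishing setup are assumptions), a minimal job using the official PyPA action could look roughly like this:

```yaml
# Generic sketch of a PyPI publish job; not the actual workflow from this PR.
name: publish

on:
  release:
    types: [published]

jobs:
  pypi:
    runs-on: ubuntu-latest
    permissions:
      id-token: write             # required for PyPI "trusted publishing"
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.x"
      # Build the sdist; wheels would come from a separate cibuildwheel job.
      - run: python -m pip install build && python -m build --sdist
      - uses: pypa/gh-action-pypi-publish@release/v1
```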
That makes sense. There are no real tests that could be run on a GitHub CI server (maybe this would even be possible, but I don't see how it could be done without …).
+1
Creating a GitHub workflow would definitely be nice, but some concrete examples for generating artifacts (sdists, wheels, documentation, etc.) would help me get a better understanding (or put it into the …).
I made two GitHub workflows:
I guess the missing bit now is to write some minimal doc on how to build and publish to PyPI manually from a local environment, but it pretty much boils down to running …
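The exact commands are cut off above; purely as an illustration (assuming the uv-based setup discussed in this thread and an API token at hand), a manual release could boil down to something like:

```sh
# Illustrative only; assumes uv is installed and $PYPI_TOKEN holds a PyPI API token.
uv build                            # writes the sdist and a local wheel into dist/
uv publish --token "$PYPI_TOKEN"    # uploads everything in dist/ to PyPI

# Roughly equivalent with the classic tooling:
python -m pip install build twine
python -m build
twine upload dist/*
```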
I installed uv/uvx a few days ago and ran … @mttbernardini, if you have an idea, let me know (I can also post full logs, but this was on a pretty standard Raspbian install, on your branch).
@larsimmisch unfortunately I'm not able to reproduce. I ran the same command as yours (which pulls …).
For what it's worth, my environment is Debian Trixie x86_64, though I doubt it matters, since the build itself runs on predefined Docker images.
@mttbernardini Good idea with the … Is …? I'll try on an Intel machine this weekend.
Ah I see, I'll try building on arm64 too and see.
I'm working on modernising the library to:
- … top-level `.pyi` files are not supported …
- … `gh-pages`, packaging and uploading sdists and wheels to PyPI (there's only an sdist on PyPI currently), etc.

This will be a metadata rework only: no new features added to the library itself, and no breaking changes expected.
Opening the PR for any early feedback, but this is still WIP.